OPTML 2017: Variable Metric Proximal Gradient Method with Diagonal Barzilai-Borwein Stepsize

Authors

  • Youngsuk Park
  • Stephen Boyd
  • Sauptik Dhar
  • Mohak Shah
Abstract

We propose a diagonal metric selection for the variable metric proximal gradient (VMPG) method. The proposed metric better captures the local geometry of the problem and yields improved convergence compared to standard proximal gradient (PG) methods with Barzilai-Borwein (BB) stepsize selection. Further, we provide convergence guarantees for the proposed method and illustrate its advantages over traditional PG with BB stepsizes through empirical results.
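To make the idea concrete, below is a minimal sketch of one such iteration on the lasso problem min (1/2)‖Ax − b‖² + λ‖x‖₁. The per-coordinate secant rule d_i ≈ y_i/s_i and the clipping safeguard are illustrative assumptions, not necessarily the paper's exact metric selection; with a diagonal metric D = diag(d), the scaled prox of the ℓ₁ norm reduces to coordinate-wise soft-thresholding with thresholds λ/d_i. The clipping keeps the metric positive and bounded, which is the kind of condition convergence analyses typically require.

```python
import numpy as np

def soft_threshold(v, t):
    # Elementwise prox of the weighted l1 norm: prox_{t*|.|}(v).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def vmpg_diag_bb(A, b, lam, iters=200):
    # Lasso objective: f(x) + h(x) with f = 0.5||Ax-b||^2, h = lam*||x||_1.
    n = A.shape[1]
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of grad f
    d = np.full(n, L)                  # start from the safe scalar metric L*I
    x = np.zeros(n)
    g = A.T @ (A @ x - b)
    for _ in range(iters):
        # Variable-metric prox step x+ = prox_h^D(x - D^{-1} grad f(x));
        # for diagonal D this is soft-thresholding with thresholds lam/d_i.
        x_new = soft_threshold(x - g / d, lam / d)
        g_new = A.T @ (A @ x_new - b)
        s, y = x_new - x, g_new - g
        # Per-coordinate BB-style secant ratio y_i/s_i, clipped to keep
        # the metric positive and bounded (a heuristic safeguard).
        ok = np.abs(s) > 1e-12
        d = np.where(ok, np.clip(y / np.where(ok, s, 1.0), 1e-6 * L, L), d)
        x, g = x_new, g_new
    return x
```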


Related Articles

Bregman operator splitting with variable stepsize for total variation image reconstruction

This paper develops a Bregman operator splitting algorithm with variable stepsize (BOSVS) for solving problems of the form min{φ(Bu) + (1/2)‖Au − f‖²}, where φ may be nonsmooth. The original Bregman operator splitting (BOS) algorithm employed a fixed stepsize, while BOSVS uses a line search to achieve better efficiency. These schemes are applicable to total variation (TV)-based image reconstruction...

The cyclic Barzilai–Borwein method for unconstrained optimization

In the cyclic Barzilai–Borwein (CBB) method, the same Barzilai–Borwein (BB) stepsize is reused for m consecutive iterations. It is proved that CBB is locally linearly convergent at a local minimizer with positive definite Hessian. Numerical evidence indicates that when m > n/2 ≥ 3, where n is the problem dimension, CBB is locally superlinearly convergent. In the special case m = 3 and n = 2, it i...
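As a quick illustration of the cycling idea, here is a minimal CBB loop for smooth unconstrained minimization. The BB1 formula α = sᵀs/sᵀy, the cycle length m = 3, and the quadratic test problem are illustrative choices, not specifics of that paper.

```python
import numpy as np

def cbb(grad, x0, m=3, iters=300, alpha0=1e-3):
    # Cyclic Barzilai-Borwein: recompute the BB1 stepsize
    # alpha = s's / s'y once every m iterations, reuse it in between.
    x, g, alpha = x0.copy(), grad(x0), alpha0
    for k in range(iters):
        x_new = x - alpha * g
        g_new = grad(x_new)
        if k % m == m - 1:                 # end of a cycle: refresh stepsize
            s, y = x_new - x, g_new - g
            sy = s @ y
            if sy > 1e-12:                 # guard against nonpositive curvature
                alpha = (s @ s) / sy
        x, g = x_new, g_new
    return x

# Example: minimize 0.5 x'Qx for a positive definite diagonal Q.
Q = np.diag(np.arange(1.0, 6.0))
x_min = cbb(lambda x: Q @ x, np.ones(5))   # converges to the origin
```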

Adaptive ADMM with Spectral Penalty Parameter Selection

The alternating direction method of multipliers (ADMM) is a versatile tool for solving a wide range of constrained optimization problems, with differentiable or non-differentiable objective functions. Unfortunately, its performance is highly sensitive to a penalty parameter, which often makes ADMM unreliable and hard to automate for a non-expert user. We tackle this weakness of ADMM by proposing...
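For context, the sketch below shows ADMM on the lasso with the classical residual-balancing heuristic for adapting the penalty ρ. This is the standard baseline rule, not the spectral selection proposed in that paper; the 10x threshold and the factor of 2 are conventional illustrative values.

```python
import numpy as np

def admm_lasso_adaptive(A, b, lam, rho=1.0, iters=200):
    # ADMM for lasso: min 0.5||Ax-b||^2 + lam*||z||_1  s.t.  x = z,
    # with residual balancing to adapt the penalty rho.
    m, n = A.shape
    AtA, Atb = A.T @ A, A.T @ b
    x = z = u = np.zeros(n)
    for _ in range(iters):
        x = np.linalg.solve(AtA + rho * np.eye(n), Atb + rho * (z - u))
        z_old = z
        z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)
        u = u + x - z                          # scaled dual update
        r = np.linalg.norm(x - z)              # primal residual
        s = np.linalg.norm(rho * (z - z_old))  # dual residual
        if r > 10 * s:       # primal residual lagging: increase rho
            rho *= 2.0; u /= 2.0               # rescale scaled dual
        elif s > 10 * r:     # dual residual lagging: decrease rho
            rho /= 2.0; u *= 2.0
    return z
```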

A non-monotonic method for large-scale non-negative least squares

We present a new algorithm for nonnegative least squares (NNLS). Our algorithm extends the unconstrained quadratic optimization algorithm of Barzilai and Borwein (BB) (J. Barzilai and J. M. Borwein, Two-Point Step Size Gradient Methods, IMA J. Numer. Anal., 1988) to handle nonnegativity constraints. Our extension differs in several basic aspects from other constrained BB variants. The mo...
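As a reference point, the generic way to carry BB steps over to NNLS is projected gradient with a BB stepsize, sketched below. Per the abstract, the paper's extension differs from this in several basic aspects, so treat this only as the standard constrained-BB variant it is contrasted with.

```python
import numpy as np

def nnls_projected_bb(A, b, iters=200):
    # NNLS: min_{x >= 0} 0.5||Ax-b||^2 via projected gradient
    # steps with a BB1 stepsize (generic baseline, not the paper's method).
    x = np.zeros(A.shape[1])
    g = A.T @ (A @ x - b)
    alpha = 1.0 / np.linalg.norm(A, 2) ** 2    # safe initial stepsize
    for _ in range(iters):
        x_new = np.maximum(x - alpha * g, 0.0)  # project onto x >= 0
        g_new = A.T @ (A @ x_new - b)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                          # guard nonpositive curvature
            alpha = (s @ s) / sy                # BB1 stepsize
        x, g = x_new, g_new
    return x
```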

Multivariate spectral gradient method for unconstrained optimization

A multivariate spectral gradient method is proposed for solving unconstrained optimization problems. Combined with a quasi-Newton property, the multivariate spectral gradient method allows an individual adaptive stepsize along each coordinate direction, which guarantees that the method is finitely convergent for positive definite quadratics. In particular, it converges in no more than two steps for posit...



Journal:

Volume   Issue

Pages

Publication year: 2017